# Autoregressive Language Models
All of the models below fall under the Large Language Model category.

| Model | Organization | License | Description | Tags | Downloads | Likes |
|---|---|---|---|---|---|---|
| Ko En Llama2 13b | hyunseoki | | A Korean-English bilingual autoregressive language model based on the LLaMA2-13B architecture, trained mainly on Korean corpora while retaining its English capabilities. | Transformers, Korean | 1,850 | 27 |
| Molm 700M 4B | ibm-research | Apache-2.0 | MoLM is a series of language models built on the Mixture-of-Experts (MoE) architecture; the 700M-4B version has 4 billion parameters in total but the computational cost of a 700-million-parameter dense model. | Transformers | 36 | 6 |
| Polyglot Ko 5.8b | EleutherAI | Apache-2.0 | Polyglot-Ko-5.8B is a large-scale Korean autoregressive language model developed by EleutherAI's multilingual team, with 5.8 billion parameters trained on 863 GB of Korean data. | Transformers, Korean | 1,148 | 65 |
| GPT Neo 2.7B Janeway | KoboldAI | MIT | A sci-fi/fantasy-themed language model fine-tuned from GPT-Neo 2.7B. | Transformers, English | 93 | 6 |
| GPT Neo 2.7B Picard | KoboldAI | MIT | A sci-fi-themed language model fine-tuned from GPT-Neo 2.7B that excels at generating science-fiction text. | Transformers, English | 40 | 7 |
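
All of these models appear to be standard decoder-only checkpoints served through the Hugging Face Transformers library, so they can be loaded with the `AutoModelForCausalLM` interface. The sketch below uses Polyglot-Ko 5.8B as an example; the repository ID, precision, and sampling settings are illustrative assumptions, and the other listed models follow the same pattern.

```python
# Minimal sketch: load one of the listed autoregressive models with Transformers
# and sample a continuation. Settings are illustrative, not prescriptive.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/polyglot-ko-5.8b"  # assumed repository ID for the Polyglot Ko 5.8b entry

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 5.8B parameters: half precision keeps memory use manageable
    device_map="auto",          # requires `accelerate`; spreads layers across available devices
)

prompt = "대한민국의 수도는"  # "The capital of South Korea is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```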